The new spectral conjugate gradient method for large-scale unconstrained optimisation

Authors
Abstract


Similar resources

A Self-Adjusting Spectral Conjugate Gradient Method for Large-Scale Unconstrained Optimization

Additionally, we assume that there exist positive constants $\gamma$ and $\bar{\gamma}$ such that $0 < \gamma \le \|g_k\| \le \bar{\gamma}$ for all $k \ge 1$ (21); then we have the following result. Theorem 2. Consider the method (2), (8) and (12), where $d_k$ is a descent direction. If (21) holds, there exist positive constants $\xi_1$, $\xi_2$, and $\xi_3$ such that relations …
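
For context, a spectral conjugate gradient direction typically has the form below; this is only a generic sketch, since the particular $\theta_k$ and $\beta_k$ defined by (2), (8) and (12) of that paper are not reproduced in the snippet above.

$$ d_0 = -g_0, \qquad d_k = -\theta_k g_k + \beta_k d_{k-1}, \quad k \ge 1, $$

where $\theta_k > 0$ is the spectral (scaling) parameter and $\beta_k$ is the conjugate gradient parameter; the descent requirement in Theorem 2 amounts to $g_k^{\top} d_k < 0$ for all $k$.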


A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
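
The abstract does not give the hybrid $\beta_k$ formula, so the following is only a generic sketch of a nonlinear conjugate gradient loop with a strong Wolfe line search (SciPy's line_search); the hybrid max(0, min(β_HS, β_DY)) rule, the fallback step, and the tolerances are placeholder assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG loop; NOT the specific method of the cited paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search (SciPy); alpha is None if the search fails.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4                    # crude fallback step (assumption)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) < 1e-12:
            beta = 0.0                      # restart with steepest descent
        else:
            # Placeholder hybrid: clip beta between 0 and min(beta_HS, beta_DY)
            beta = max(0.0, min(g_new @ y / dy, g_new @ g_new / dy))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


# Example: minimise a 10-dimensional Rosenbrock function
x_star = hybrid_cg(rosen, rosen_der, np.zeros(10))
```

The restart (beta = 0) whenever $d_k^{\top} y_k$ is nearly zero is a common safeguard that keeps the sketch well defined; an actual hybrid method would instead rely on its specific $\beta_k$ formula and line search conditions.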


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on eigenvalue analysis. The globa...
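
The sufficient descent condition referred to here is the standard requirement that some constant $c > 0$ exists with

$$ g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k, $$

holding independently of which line search is used.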


New Parallel Conjugate Directions Method for Unconstrained Optimisation

In this report we present a new nongradient method for unconstrained minimisation which is designed for implementation on a parallel MIMD computer. This algorithm, developed from the classical serial algorithm of Smith [21], also generates conjugate directions and has guaranteed quadratic convergence. The underlying formal theory whereby conjugate directions may be generated by the new method ...
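
As a reminder of the terminology, directions $d_1, \dots, d_n$ are conjugate with respect to a symmetric positive definite matrix $A$ when

$$ d_i^{\top} A\, d_j = 0 \quad \text{for all } i \ne j, $$

which is what yields exact minimisation of a strictly convex quadratic in finitely many steps (quadratic termination).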


A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and give further analysis.
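
The Zoutendijk condition mentioned here is the summability condition

$$ \sum_{k \ge 0} \frac{(g_k^{\top} d_k)^2}{\|d_k\|^2} < \infty, $$

which, combined with a sufficient descent direction and suitable control of $\|d_k\|$, forces $\liminf_{k \to \infty} \|g_k\| = 0$ and hence global convergence.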



Journal

Journal title: Journal of Inequalities and Applications

Year: 2020

ISSN: 1029-242X

DOI: 10.1186/s13660-020-02375-z